375 research outputs found

    Attention is All They Need: Exploring the Media Archaeology of the Computer Vision Research Paper

    The success of deep learning has led to the rapid transformation and growth of many areas of computer science, including computer vision. In this work, we examine the effects of this growth through the computer vision research paper itself by analyzing the figures and tables in research papers from a media archaeology perspective. We ground our investigation both through interviews with veteran researchers spanning computer vision, graphics and visualization, and through computational analysis of a decade of vision conference papers. Our analysis focuses on elements with roles in advertising, measuring and disseminating an increasingly commodified "contribution." We argue that each of these elements has shaped and been shaped by the climate of computer vision, ultimately contributing to that commodification. Through this work, we seek to motivate future discussion surrounding the design of the research paper and the broader socio-technical publishing system.

    Are Metrics Enough? Guidelines for Communicating and Visualizing Predictive Models to Subject Matter Experts

    Presenting a predictive model's performance is a communication bottleneck that threatens collaborations between data scientists and subject matter experts (SMEs). Accuracy and error metrics alone fail to tell the whole story of a model - its risks, strengths, and limitations - making it difficult for subject matter experts to feel confident in their decision to use a model. As a result, models may fail in unexpected ways or go entirely unused, as subject matter experts disregard poorly presented models in favor of familiar, yet arguably substandard, methods. In this paper, we describe an iterative study conducted with both subject matter experts and data scientists to understand the gaps in communication between these two groups. We find that, while the two groups share the common goals of understanding the data and the model's predictions, friction can stem from unfamiliar terms, metrics, and visualizations - limiting the transfer of knowledge to SMEs and discouraging clarifying questions from being asked during presentations. Based on our findings, we derive a set of communication guidelines that use visualization as a common medium for communicating the strengths and weaknesses of a model. We demonstrate our guidelines in a regression modeling scenario and elicit feedback on their use from subject matter experts. From our demonstration, subject matter experts were more comfortable discussing a model's performance, more aware of the trade-offs for the presented model, and better equipped to assess the model's risks - ultimately informing and contextualizing the model's use beyond text and numbers.

    HyperNP: Interactive Visual Exploration of Multidimensional Projection Hyperparameters

    Projection algorithms such as t-SNE or UMAP are useful for the visualization of high-dimensional data, but depend on hyperparameters which must be tuned carefully. Unfortunately, iteratively recomputing projections to find the optimal hyperparameter value is computationally intensive and unintuitive due to the stochastic nature of these methods. In this paper we propose HyperNP, a scalable method that allows real-time interactive hyperparameter exploration of projection methods by training neural network approximations. HyperNP can be trained on a fraction of the total data instances and hyperparameter configurations, and can compute projections for new data and hyperparameters at interactive speeds. HyperNP is compact in size and fast to compute, allowing it to be embedded in lightweight visualization systems such as web browsers. We evaluate HyperNP across three datasets in terms of accuracy and speed. The results suggest that HyperNP is accurate, scalable, interactive, and appropriate for use in real-world settings.
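    The core idea - amortizing an expensive projection across its hyperparameter range with a neural network - can be sketched as follows. This is a minimal illustration, not the authors' implementation: it regresses precomputed t-SNE layouts as a function of (features, perplexity), whereas the real system must also handle aligning embeddings across hyperparameter values.

```python
# Minimal sketch (not the HyperNP code): approximate t-SNE across a few
# perplexity values with one network, so new (data, perplexity) queries
# can be answered at interactive speed without re-running t-SNE.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE
from sklearn.neural_network import MLPRegressor

X, _ = load_digits(return_X_y=True)
X = X[:500]  # subsample for speed

# Precompute "ground truth" projections for a few hyperparameter values.
perplexities = [5, 30, 50]
inputs, targets = [], []
for p in perplexities:
    Y = TSNE(perplexity=p, random_state=0).fit_transform(X)
    inputs.append(np.column_stack([X, np.full(len(X), p)]))
    targets.append(Y)

# One network maps (features, hyperparameter) -> 2D coordinates.
net = MLPRegressor(hidden_layer_sizes=(256, 128), max_iter=500, random_state=0)
net.fit(np.vstack(inputs), np.vstack(targets))

# Interactive query: an unseen perplexity, answered by one forward pass.
Y_interp = net.predict(np.column_stack([X, np.full(len(X), 20)]))
```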

    RekomGNN: Visualizing, Contextualizing and Evaluating Graph Neural Networks Recommendations

    Content recommendation tasks increasingly use Graph Neural Networks (GNNs), but it remains challenging for machine learning experts to assess the quality of their outputs. Visualization systems for GNNs that could support this interrogation are few. Moreover, those that do exist focus primarily on exposing GNN architectures for tuning and prediction tasks, and do not address the challenges of recommendation tasks. We developed RekomGNN, a visual analytics system that supports ML experts in exploring GNN recommendations across several dimensions and making annotations about their quality. RekomGNN straddles the design space between neural network and recommender system visualization to arrive at a set of encoding and interaction choices for recommendation tasks. We found that RekomGNN helps experts make qualitative assessments of the GNN's results, which they can use for model refinement. Overall, our contributions and findings add to the growing understanding of visualizing GNNs for increasingly complex tasks.

    High-resolution age modelling of peat bogs from northern Alberta, Canada, using pre- and post-bomb 14C, 210Pb and historical cryptotephra

    High-resolution studies of peat profiles are frequently undertaken to investigate natural and anthropogenic disturbances over time. However, overlapping profiles of the most commonly applied age-dating techniques, including 14C and 210Pb, often show significant offsets (>decadal) and biases that can be difficult to resolve. Here we investigate variations in the chronometers and individual site histories from six ombrotrophic peat bogs in central and northern Alberta. Dates produced using pre- and post-bomb 14C, 210Pb (corroborated with 137Cs and 241Am), and cryptotephra peaks are compared and then integrated using OxCal's P_Sequence function to produce a single Bayesian age model. Environmental histories for each site, obtained using physical and chemical characteristics of the peat cores (e.g. plant macrofossils, humification, ash content and dry density), provide important constraints for the models by highlighting periods with significant changes in accumulation rate, e.g. fire events, permafrost development, and prolonged surficial drying. Despite variable environmental histories, it is possible to produce high-resolution age-depth models for each core sequence. Consistent offsets between 14C and 210Pb dates pre-1960s are seen at five of the six sites, but tephra-corrected 210Pb data can be used to produce more coherent models at three of these sites. Processes such as permafrost development and thaw, surficial drying and local fires can disrupt the normal processes by which chronological markers and environmental records are incorporated into the peat record. Consequently, applying standard dating methodologies to these records will result in even greater uncertainties and discrepancies between the different dating tools. These results show that using any single method to date peat profiles where accumulation has not been uniform over time may be unreliable, but a comprehensive multi-method investigation paired with the application of Bayesian statistics can produce more robust chronologies. New cryptotephra data for the Alberta region are also reported here, including the historical Novarupta-Katmai 1912 eruption, White River Ash (East), and glass from Mt. St. Helens, Mt. Churchill, and probable Aleutian sources.
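    The central move - merging dates from several chronometers into a single age-depth relation that is forced to increase with depth - can be illustrated with a much simpler stand-in for OxCal's P_Sequence: an uncertainty-weighted monotone fit. The dates below are invented for illustration; the real model is fully Bayesian and works with calibrated 14C likelihoods and an accumulation-rate prior.

```python
# Toy stand-in for a Bayesian age-depth model: enforce that age increases
# downcore, weighting each chronometer by its precision. All values are
# illustrative, not data from the paper.
import numpy as np
from sklearn.isotonic import IsotonicRegression

# (depth cm, age cal yr BP, 1-sigma error, chronometer)
dates = [
    (5,   -40, 3,  "210Pb"),
    (12,   20, 5,  "210Pb"),
    (18,   38, 2,  "tephra (Novarupta-Katmai 1912)"),
    (30,  180, 40, "14C"),
    (55,  640, 45, "14C"),
]
depth = np.array([d[0] for d in dates], dtype=float)
age = np.array([d[1] for d in dates], dtype=float)
err = np.array([d[2] for d in dates], dtype=float)

# Monotone (age never decreases with depth) weighted fit.
model = IsotonicRegression(increasing=True, out_of_bounds="clip")
model.fit(depth, age, sample_weight=1.0 / err**2)

# Interpolate the model to every 1 cm slice of the core.
print(model.predict(np.arange(0.0, 56.0, 1.0)))
```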

    Peat bogs in northern Alberta, Canada reveal decades of declining atmospheric Pb contamination

    Peat cores were collected from six bogs in northern Alberta to reconstruct changes in the atmospheric deposition of Pb, a valuable tracer of human activities. In each profile, the maximum Pb enrichment is found well below the surface. Radiometric age dating using three independent approaches (14C measurements of plant macrofossils combined with the atmospheric bomb pulse curve, plus 210Pb confirmed using the fallout radionuclides 137Cs and 241Am) showed that Pb contamination has been in decline for decades. Today, the surface layers of these bogs are comparable in composition to the "cleanest" peat samples ever found in the Northern Hemisphere, from a Swiss bog ~6000 to 9000 years old. The lack of contemporary Pb contamination in the Alberta bogs is testimony to successful international efforts of the past decades to reduce anthropogenic emissions of this potentially toxic metal to the atmosphere.

    Peat Bogs Document Decades of Declining Atmospheric Contamination by Trace Metals in the Athabasca Bituminous Sands Region

    Peat cores were collected from five bogs in the vicinity of open pit mines and upgraders of the Athabasca Bituminous Sands, the largest reservoir of bitumen in the world. Frozen cores were sectioned into 1 cm slices, and trace metals were determined in the ultraclean SWAMP lab using ICP-QMS. The uppermost sections of the cores were age-dated with 210Pb using ultralow-background gamma spectrometry, and selected plant macrofossils were dated using 14C. At each site, trace metal concentrations as well as enrichment factors (calculated relative to the corresponding element/Th ratio of the Upper Continental Crust) reveal maximum values 10 to 40 cm below the surface, showing that the zenith of atmospheric contamination occurred in the past. The age-depth relationships show that atmospheric contamination by trace metals (Ag, Cd, Sb, Tl, but also V, Ni, and Mo, which are enriched in bitumen) has been declining in northern Alberta for decades. In fact, the greatest contemporary enrichments of Ag, Cd, Sb, and Tl (in the top layers of the peat cores) are found at the control site (Utikuma), 264 km to the SW, suggesting that long-range atmospheric transport from other sources must be duly considered in any source assessment.
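    The enrichment factor used here is the conventional crustal normalization: the sample's metal/Th ratio divided by the same ratio in the Upper Continental Crust, so values well above 1 flag a non-crustal (typically anthropogenic) contribution. A minimal sketch, with illustrative reference values:

```python
# Crustal enrichment factor: EF = (M/Th)_sample / (M/Th)_UCC.
# UCC concentrations below are illustrative round numbers; use a published
# reference composition (e.g. Rudnick & Gao 2003) for real work.
UCC_PPM = {"Th": 10.5, "Pb": 17.0, "Cd": 0.09, "Sb": 0.4}

def enrichment_factor(metal: str, c_metal_ppm: float, c_th_ppm: float) -> float:
    """EF of `metal` in a peat slice with measured metal and Th concentrations."""
    sample_ratio = c_metal_ppm / c_th_ppm
    crust_ratio = UCC_PPM[metal] / UCC_PPM["Th"]
    return sample_ratio / crust_ratio

# A hypothetical slice with 8 ppm Pb and 0.5 ppm Th:
print(enrichment_factor("Pb", 8.0, 0.5))  # ~9.9, i.e. strongly enriched
```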

    Bose-Einstein correlations of same-sign charged pions in the forward region in pp collisions at √s=7 TeV

    Bose-Einstein correlations of same-sign charged pions, produced in proton-proton collisions at a 7 TeV centre-of-mass energy, are studied using a data sample collected by the LHCb experiment. The signature for Bose-Einstein correlations is observed in the form of an enhancement of pairs of like-sign charged pions with small four-momentum difference squared. The charged-particle multiplicity dependence of the Bose-Einstein correlation parameters describing the correlation strength and the size of the emitting source is investigated, determining both the correlation radius and the chaoticity parameter. The measured correlation radius is found to increase as a function of increasing charged-particle multiplicity, while the chaoticity parameter is seen to decrease.
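    For context, the correlation radius R and chaoticity λ quoted above are typically extracted by fitting the two-particle correlation function in the four-momentum difference Q; one commonly used exponential parametrization (the exact fit function of the analysis may differ) is

```latex
C_2(Q) = N \left( 1 + \lambda\, e^{-R Q} \right) \left( 1 + \delta Q \right)
```

    where Q^2 = -(q_1 - q_2)^2 is the four-momentum difference squared of the pion pair, N is a normalization, and the linear δQ term absorbs long-range correlations. λ → 1 corresponds to a fully chaotic (incoherent) emitting source, and R characterizes its size.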

    Measurement of the inelastic pp cross-section at a centre-of-mass energy of 13 TeV

    The cross-section for inelastic proton-proton collisions at a centre-of-mass energy of 13 TeV is measured with the LHCb detector. The fiducial cross-section for inelastic interactions producing at least one prompt long-lived charged particle with momentum p > 2 GeV/c in the pseudorapidity range 2 < η < 5 is determined to be σ_acc = 62.2 ± 0.2 ± 2.5 mb. The first uncertainty is the intrinsic systematic uncertainty of the measurement, the second is due to the uncertainty on the integrated luminosity. The statistical uncertainty is negligible. Extrapolation to full phase space yields the total inelastic proton-proton cross-section σ_inel = 75.4 ± 3.0 ± 4.5 mb, where the first uncertainty is experimental and the second due to the extrapolation. An updated value of the inelastic cross-section at a centre-of-mass energy of 7 TeV is also reported.
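    Schematically (not the analysis' exact procedure), the two quoted numbers are related through the acceptance fraction f_acc, the modeled fraction of inelastic collisions that leave at least one such charged particle in the fiducial region:

```latex
\sigma_{\mathrm{inel}} = \frac{\sigma_{\mathrm{acc}}}{f_{\mathrm{acc}}},
\qquad
f_{\mathrm{acc}} \approx \frac{62.2~\mathrm{mb}}{75.4~\mathrm{mb}} \approx 0.82 .
```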

    WISE x SuperCOSMOS photometric redshift catalog: 20 million galaxies over 3π steradians

    We cross-match the two currently largest all-sky photometric catalogs, mid-infrared WISE and SuperCOSMOS scans of UKST/POSS-II photographic plates, to obtain a new galaxy sample that covers 3π steradians. In order to characterize and purify the extragalactic dataset, we use external GAMA and SDSS spectroscopic information to define quasar and star loci in multicolor space, aiding the removal of contamination from our extended-source catalog. After appropriate data cleaning we obtain a deep wide-angle galaxy sample that is approximately 95% pure and 90% complete at high Galactic latitudes. The catalog contains close to 20 million galaxies over almost 70% of the sky, outside the Zone of Avoidance and other confused regions, with a mean surface density of over 650 sources per square degree. Using multiwavelength information from two optical and two mid-IR photometric bands, we derive photometric redshifts for all the galaxies in the catalog, using the ANNz framework trained on the final GAMA-II spectroscopic data. Our sample has a median redshift of z_med = 0.2, but with a broad dN/dz reaching up to z > 0.4. The photometric redshifts have a mean bias of |δz| ~ 10^-3, a normalized scatter of σ_z = 0.033, and less than 3% outliers beyond 3σ_z. Comparison with external datasets shows no significant variation of photo-z quality with sky position. Together with the overall statistics, we also provide a more detailed analysis of photometric redshift accuracy as a function of magnitudes and colors. The final catalog is appropriate for 'all-sky' 3D cosmology to unprecedented depths, in particular through cross-correlations with other large-area surveys. It should also be useful for source pre-selection and identification in forthcoming surveys such as TAIPAN or WALLABY.
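    The empirical photo-z approach described - a neural network trained on spectroscopic redshifts, judged by bias, normalized scatter and outlier rate - can be sketched generically. The network size and synthetic data below are illustrative stand-ins; ANNz's actual architecture, inputs and training differ.

```python
# Generic empirical photo-z sketch: regress spec-z from two optical and two
# mid-IR magnitudes, then compute the usual quality statistics on a test set.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 20000
mags = rng.normal(size=(n, 4))  # stand-in for B, R, W1, W2 magnitudes
z_spec = 0.2 + 0.05 * mags @ np.array([0.5, -0.3, 0.2, 0.1]) \
         + rng.normal(0.0, 0.02, n)
z_spec = np.clip(z_spec, 0.0, None)

X_tr, X_te, z_tr, z_te = train_test_split(mags, z_spec, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=300, random_state=0)
z_photo = net.fit(X_tr, z_tr).predict(X_te)

# Standard photo-z quality metrics: bias, robust normalized scatter, outliers.
dz = (z_photo - z_te) / (1 + z_te)
sigma_z = 1.4826 * np.median(np.abs(dz - np.median(dz)))
print(f"bias={np.mean(dz):.4f}  sigma_z={sigma_z:.4f}  "
      f"outliers={np.mean(np.abs(dz) > 3 * sigma_z):.2%}")
```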